As the AI industry’s insatiable energy demands collide with infrastructure limits, there is mounting pressure to accelerate the deployment of nuclear energy sources, often at the expense of safety and oversight. There are major nuclear deployment efforts underway to meet this recent surge in demand, but speed comes with risk. Nuclear development timelines – often 10 to 20 years – are out of step with the pace of AI deployment. Efforts to fast-track these timelines raise serious safety and oversight concerns. The ‘AI arms race’ is increasingly being used to justify efforts to roll back long-standing nuclear safety and regulatory mechanisms that exist to protect the public, workers, and the environment.

The AI Now Institute is launching a new stream of work focused on AI’s impact on energy infrastructure, with a special focus on nuclear safety and regulatory issues. As part of that, we’re welcoming Dr. Sofia Guerra as an advisor. Dr. Guerra is a global expert in the assurance and governance of complex digital systems. She has spent decades working with the U.S. Nuclear Regulatory Commission, the UK Office for Nuclear Regulation, and the International Atomic Energy Agency to strengthen public oversight and safety in high-risk environments. 

Our new energy infrastructure initiative builds on our broader research into AI safety and governance, expanding our focus to energy security, regulatory accountability, and the concentration of power in high-stakes decision-making. At a moment when industry narratives risk eroding public protections in the name of innovation, we’re asking: does AI, with its mounting harms and energy demands, justify the risks associated with nuclear plants and potentially “fast-tracked” regulation?

We’ll be examining the implications of AI deployment in high-risk sectors, challenging efforts to erode public oversight, and spotlighting strategies that prioritize safety and democratic accountability.
